Robust Autoregression: Student-t Innovations Using Variational Bayes
Authors
Abstract
Similar resources
Robust Inference with Variational Bayes
In Bayesian analysis, the posterior follows from the data and a choice of a prior and a likelihood. One hopes that the posterior is robust to reasonable variation in the choice of prior and likelihood, since this choice is made by the modeler and is necessarily somewhat subjective. For example, the process of prior elicitation may be prohibitively time-consuming, two practitioners may have irre...
Empirical Bayes Inference for Means Using Student t Prior Distributions
Hierarchical models for L studies, domains or experiments often assume that the study means have a common normal population distribution. However, modeling normal sampling distributions with a normal population distribution may overstate the level of exchangeability of the studies. Using heavy-tailed population distributions, in particular t distributions, provides some protection from combining...
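A minimal illustration of the heavy-tailed shrinkage idea described above, using the Student-t's scale-mixture-of-normals representation and a short Gibbs sampler. This is a generic sketch, not the paper's algorithm: the degrees of freedom and population scale are held fixed for brevity, and all data and names are made up.

```python
# Illustrative sketch (not the paper's algorithm): shrinkage of study means
# toward a common center under a Student-t population distribution, via the
# t's scale-mixture-of-normals representation and a small Gibbs sampler.
# nu (degrees of freedom) and tau (population scale) are fixed for brevity.
import numpy as np

rng = np.random.default_rng(0)

# Simulated "studies": observed means y_j with known standard errors s_j.
y = np.array([0.2, 0.1, 0.3, 0.15, 2.5])   # last study is an apparent outlier
s = np.array([0.1, 0.1, 0.1, 0.1, 0.1])

nu, tau = 4.0, 0.5          # fixed t degrees of freedom and population scale
n_iter, burn = 4000, 1000

theta = y.copy()            # study-level means
mu = y.mean()               # population center
draws = []

for it in range(n_iter):
    # lambda_j | theta, mu  ~  Gamma((nu+1)/2, rate = (nu + ((theta-mu)/tau)^2)/2)
    lam = rng.gamma((nu + 1) / 2.0, 2.0 / (nu + ((theta - mu) / tau) ** 2))
    # theta_j | rest: N(theta_j; y_j, s_j^2) likelihood times N(mu, tau^2/lambda_j) prior
    prec = 1.0 / s**2 + lam / tau**2
    mean = (y / s**2 + lam * mu / tau**2) / prec
    theta = rng.normal(mean, 1.0 / np.sqrt(prec))
    # mu | rest under a flat prior
    w = lam / tau**2
    mu = rng.normal(np.sum(w * theta) / np.sum(w), 1.0 / np.sqrt(np.sum(w)))
    if it >= burn:
        draws.append(theta.copy())

print("posterior means:", np.round(np.mean(draws, axis=0), 3))
```

Because the outlying study draws a small mixing weight lambda_j, it is shrunk toward the common center far less aggressively than it would be under a normal population distribution, which is the protection the abstract refers to.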
A Variational Bayes Approach to Robust Principal Component Analysis
We solve the Robust Principal Component Analysis problem: decomposing an observed matrix into a low-rank matrix plus a sparse matrix. Unlike alternative methods that approximate this l0 objective with an l1 objective and solve a convex optimization problem, we develop a corresponding generative model and solve a statistical inference problem. The main advantage of this approach is its ability ...
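For contrast with the generative approach described above, here is a sketch of the convex principal component pursuit baseline the abstract alludes to, solved with a standard ADMM loop. This is not the paper's variational Bayes model; the defaults and the synthetic data are illustrative.

```python
# Contrast sketch: convex relaxation ||L||_* + lam*||S||_1 subject to L + S = M,
# solved by ADMM with singular-value and soft thresholding (not the VB model).
import numpy as np

def soft_threshold(X, t):
    """Elementwise soft-thresholding (prox of the l1 norm)."""
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def svd_threshold(X, t):
    """Singular-value thresholding (prox of the nuclear norm)."""
    U, sv, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(sv - t, 0.0)) @ Vt

def pcp(M, lam=None, mu=None, n_iter=500, tol=1e-7):
    """Decompose M into low-rank L plus sparse S via ADMM with standard defaults."""
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = mu if mu is not None else 0.25 * m * n / np.abs(M).sum()
    L = np.zeros_like(M); S = np.zeros_like(M); Y = np.zeros_like(M)
    for _ in range(n_iter):
        L = svd_threshold(M - S + Y / mu, 1.0 / mu)
        S = soft_threshold(M - L + Y / mu, lam / mu)
        R = M - L - S
        Y += mu * R
        if np.linalg.norm(R) <= tol * np.linalg.norm(M):
            break
    return L, S

# Small synthetic check: low-rank structure plus a few large sparse corruptions.
rng = np.random.default_rng(0)
L0 = rng.normal(size=(60, 2)) @ rng.normal(size=(2, 60))
S0 = np.zeros((60, 60))
S0.flat[rng.choice(3600, 100, replace=False)] = rng.normal(scale=10.0, size=100)
L_hat, S_hat = pcp(L0 + S0)
print("relative low-rank error:", np.linalg.norm(L_hat - L0) / np.linalg.norm(L0))
```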
Variational inference for Student-t MLP models
This paper presents a novel methodology to infer parameters of probabilistic models whose output noise follows a Student-t distribution. The method extends earlier work on models that are linear in the parameters to nonlinear multi-layer perceptrons (MLPs). We used an EM algorithm combined with variational approximation, the evidence procedure, and an optimisation algorithm. The technique wa...
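A sketch of the linear-in-parameters starting point the abstract refers to: regression with Student-t output noise fitted by EM through the scale-mixture representation, where the E-step produces per-point precision weights and the M-step is weighted least squares. The degrees of freedom are held fixed; this is illustrative, not the paper's MLP algorithm.

```python
# EM for regression with Student-t output noise via its scale-mixture form
# (linear-in-parameters sketch; nu is fixed, names are illustrative).
import numpy as np

def fit_t_regression(X, y, nu=4.0, n_iter=50):
    n, d = X.shape
    w = np.linalg.lstsq(X, y, rcond=None)[0]      # ordinary least-squares start
    sigma2 = np.mean((y - X @ w) ** 2)
    for _ in range(n_iter):
        r = y - X @ w
        # E-step: E[lambda_i | r_i] under the Gamma(nu/2, nu/2) mixing density
        lam = (nu + 1.0) / (nu + r**2 / sigma2)
        # M-step: weighted least squares and weighted noise variance
        w = np.linalg.solve(X.T @ (lam[:, None] * X), X.T @ (lam * y))
        sigma2 = np.mean(lam * (y - X @ w) ** 2)
    return w, sigma2

# Gross outliers pull ordinary LS but receive small weights lam_i here.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.uniform(-1, 1, 200)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.1, size=200)
y[:10] += 8.0
print("robust fit:   ", np.round(fit_t_regression(X, y)[0], 3))
print("ordinary LS:  ", np.round(np.linalg.lstsq(X, y, rcond=None)[0], 3))
```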
Auto-Encoding Variational Bayes
How can we perform efficient inference and learning in directed probabilistic models, in the presence of continuous latent variables with intractable posterior distributions, and large datasets? We introduce a stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case. Our contributi...
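At the core of the estimator described above is the reparameterisation trick: a sample from the approximate posterior is written as a deterministic function of its parameters and an auxiliary noise variable, so the gradient can be pushed through the sample. A toy pathwise-gradient sketch is below; the integrand f and all settings are illustrative, not from the paper.

```python
# Toy sketch of the reparameterisation (pathwise) gradient estimator:
# d/d(mu,sigma) of E_{z ~ N(mu, sigma^2)}[f(z)] via z = mu + sigma*eps, eps ~ N(0,1).
import numpy as np

rng = np.random.default_rng(0)

def f(z):        # toy integrand standing in for a per-datapoint ELBO term
    return z**2

def df(z):       # its derivative, through which the gradient is propagated
    return 2.0 * z

mu, sigma, n_samples = 1.0, 0.5, 100_000
eps = rng.standard_normal(n_samples)
z = mu + sigma * eps

# Pathwise estimates: d/dmu = E[f'(z)],  d/dsigma = E[f'(z) * eps]
grad_mu = df(z).mean()
grad_sigma = (df(z) * eps).mean()

# Closed form for this toy f: E[z^2] = mu^2 + sigma^2, so the grads are 2*mu and 2*sigma.
print(grad_mu, "vs exact", 2 * mu)
print(grad_sigma, "vs exact", 2 * sigma)
```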
Journal
Journal title: IEEE Transactions on Signal Processing
Year: 2011
ISSN: 1053-587X, 1941-0476
DOI: 10.1109/tsp.2010.2080271